Deep Learning: More Techniques
(Lee Hung-yi, 2:29:01)
1. Tips for Training Deep Neural Network
2. Announcement
3. Outline
4. Outline
5. ReLU
6. ReLU
7. Review: Backpropagation
8. Review: Backpropagation
9. Problem of Sigmoid
10. Vanishing Gradient Problem
11. Vanishing Gradient Problem
12. ReLU
13. ReLU
14. ReLU
15. ReLU
16. ReLU
17. ReLU
18. ReLU
19. ReLU
20. ReLU - variant
21. Maxout
22. Maxout – ReLU is a special case
23. Maxout – ReLU is a special case
24. Maxout - Training
25. Maxout - Training
26. Outline
27. Cost Function
28. Output Layer
29. Softmax
30. Softmax
31. Softmax
32. Softmax
33. Softmax
34. Softmax
35. Normalizing Input
36. Outline
37. Vanilla Gradient Descent
38. Vanilla Gradient Descent
39. Outline
40. Outline
41. Learning Rates
42. Learning Rates
43. Adagrad
44. Adagrad
45. Adagrad
46. Contradiction?
47. Intuitive Reason
48. Larger gradient, larger steps?
49. Second Derivative
50. More than one parameter
51. What to do with Adagrad?
52. Outline
53. Easy to get stuck
54. In the physical world ……
55. Momentum
56. Momentum
57. Momentum
58. Slide 57
59. Slide 58
60. Slide 57
61. Slide 58
62. Outline
63. Panacea
64. Outline
65. Panacea
66. Outline
67. Outline
68. Early Stopping
69. Outline
70. Weight Decay
71. Weight Decay
72. Weight Decay
73. Weight Decay
74. Weight Decay
75. Weight Decay
76. Outline
77. Dropout
78. Dropout
79. Dropout
80. Dropout - Intuitive Reason
81. Dropout - Intuitive Reason
82. Dropout - Intuitive Reason
83. Dropout - Ensemble
84. Dropout - Ensemble
85. Dropout - Ensemble
86. Dropout - Ensemble
87. Dropout - Ensemble
88. Practical Suggestion for Dropout
89. Concluding Remarks